Entropy and Kullback-Leibler divergence estimation based on Szegö's theorem
Abstract
In this work, a new technique for estimating the Shannon entropy and the Kullback-Leibler (KL) divergence of one-dimensional data is presented. The estimator is based on Szegö's theorem for sequences of Toeplitz matrices, which characterizes the asymptotic behavior of the eigenvalues of such matrices, and on the analogy between a probability density function (PDF) and a power spectral density (PSD), which allows a PDF of bounded support to be estimated with well-known spectral estimation techniques. Specifically, an autoregressive (AR) model is used for the PDF/PSD estimation, and the entropy is then obtained as a function of the eigenvalues of the autocorrelation Toeplitz matrix. The performance of the Szegö estimators is illustrated by means of Monte Carlo simulations and compared with previously proposed alternatives, showing good performance.
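For intuition, Szegö's theorem states that if λ_1^(n), …, λ_n^(n) are the eigenvalues of the n×n Toeplitz matrix generated by the Fourier coefficients of a PSD f, then (1/n) Σ_k g(λ_k^(n)) → (1/2π) ∫ g(f(ω)) dω over [0, 2π) for continuous g. Treating a bounded-support PDF as a PSD, its "autocorrelation" sequence is the characteristic function at integer lags, and choosing g(t) = -t log t turns the eigenvalue average into an entropy estimate. The following Python sketch illustrates this route directly from the empirical characteristic function; it is not the authors' exact algorithm (which fits an AR model to the PDF/PSD), and the function name szego_entropy, the Toeplitz order n, and the use of the sample minimum/maximum as support bounds are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import toeplitz, eigvalsh

def szego_entropy(x, n=30):
    """Hedged sketch of a Szego-type differential entropy estimate for
    1-D samples x with bounded support (not the paper's AR-based
    estimator; the name, the order n, and the support bounds taken
    from the sample min/max are illustrative assumptions)."""
    a, b = x.min(), x.max()
    # Map the data to [0, 2*pi) so the PDF can play the role of a PSD.
    theta = 2.0 * np.pi * (x - a) / (b - a)
    # "Autocorrelation" coefficients of the PDF/PSD: the empirical
    # characteristic function at integer lags k = 0, ..., n-1.
    k = np.arange(n)
    r = np.exp(-1j * np.outer(k, theta)).mean(axis=1)
    # Hermitian Toeplitz matrix built from r (first row is conj(r)).
    # By Szego's theorem, (1/n) * sum g(lam_k) approaches
    # (1/2pi) * integral of g(2*pi*p(theta)) dtheta as n grows.
    lam = eigvalsh(toeplitz(r))
    lam = np.clip(lam, 1e-12, None)  # guard against estimation noise
    # g(t) = -t*log(t) yields the entropy of theta up to a log(2*pi) shift.
    h_theta = np.log(2.0 * np.pi) + np.mean(-lam * np.log(lam))
    # Undo the change of variables back to the original scale.
    return h_theta + np.log((b - a) / (2.0 * np.pi))

# Usage sketch: Beta(2, 5) samples, whose true differential entropy
# is about -0.48; the estimate should land roughly near that value.
rng = np.random.default_rng(0)
print(szego_entropy(rng.beta(2.0, 5.0, size=10_000)))
```

The sketch covers only the entropy part of the paper; a KL divergence estimate would additionally involve the reference density. Other choices of g recover other functionals, e.g. g(t) = log t estimates the integral of log p.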
Similar resources
Some statistical inferences on the upper record of Lomax distribution
In this paper, we investigate some inferential properties of the upper records of the Lomax distribution. We also estimate the parameters of the Lomax distribution based on upper records using the method of moments (MME), maximum likelihood (MLE), Kullback-Leibler divergence of the survival function (DLS), and Bayesian methods. Finally, we compare these methods using Monte Carlo simulation.
Estimation of the Weibull parameters by Kullback-Leibler divergence of Survival functions
Recently, a new entropy-based divergence measure has been introduced which is much like the Kullback-Leibler divergence. This measure quantifies the distance between an empirical and a prescribed survival function and is much easier to compute for continuous distributions than the K-L divergence. In this paper we show that this distance converges to zero with increasing sample size, and we apply it to...
Model Confidence Set Based on Kullback-Leibler Divergence Distance
Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using a model fθ(x) that is appropriate in some sense (see below). Using Vuong's (1989) test along with a collection of k (> 2) non-nested models, this article constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confide...
Generalized Cross-Entropy Methods
The cross-entropy and minimum cross-entropy methods are well-known Monte Carlo simulation techniques for rare-event probability estimation and optimization. In this paper we investigate how these methods can be extended to provide a general non-parametric cross-entropy framework based on φ-divergence distance measures. We show how the χ² distance in particular yields a viable alternative to Kull...
A general minimax result for relative entropy
Suppose Nature picks a probability measure Pθ on a complete separable metric space X at random from a measurable set {Pθ : θ ∈ Θ}. Then, without knowing θ, a statistician picks a measure Q on X. Finally, the statistician suffers a loss D(Pθ||Q), the relative entropy between Pθ and Q. We show that the minimax and maximin values of this game are always equal, and there is always a minimax strategy in the closure of ...
Publication date: 2009